A Process-Aware Memory Compact-Device Model Using Long-Short Term Memory


Abstract

With the immense increase in data processing during the scaling down of semiconductor devices under Moore's Law, there is an urgent need to use data analytics to reach state-of-the-art performance in both manufacturing and device compact modeling. In particular, managing fabrication cost while promptly providing compact models, especially for new or emerging devices, is challenging. To ease these issues, we propose a unified, general-purpose, process-aware machine learning (ML) based compact model (CM) for resistive random-access memory (RRAM); the same methodology can be used for any device with hysteresis. A long short-term memory (LSTM) ML model is used to fit RRAM current-voltage (I-V) characteristics. The memorizing capability of the LSTM ensures one unified model covering both the low resistance state (LRS) and the high resistance state (HRS). The model is fitted to a dataset measured on fabricated samples with a TaN/HfO2/Pt/Ti/SiO2/Si structure. The resultant fitting error is 0.0096 for a sinusoidal input voltage and 0.0148 for random-walk voltage sequences. For the process-awareness demonstration, the post-oxide annealing temperature is varied from 300°C to 500°C, yielding a root mean squared error (RMSE) of 0.0028. Thus, the LSTM-based CM has the potential to compete with conventional compact models in terms of shorter development time, better handling of a large number of devices, easily incorporated process parameters, a single model accounting for both LRS and HRS, and guaranteed differentiability. We believe it will be useful for intelligent manufacturing, process tuning, and simulation program with integrated circuit emphasis (SPICE) modeling and simulation.
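The abstract describes the approach only at a high level. The following is a minimal sketch of the core idea, assuming a per-time-step input of applied voltage plus a normalized process parameter (the annealing temperature) and a per-time-step current output. The layer sizes, the synthetic hysteretic "device", and the normalization are illustrative assumptions, not the authors' implementation or dataset.

```python
# A minimal sketch (not the authors' code) of an LSTM-based compact model:
# map a voltage sequence plus a process parameter to the device current
# sequence, so one model covers both LRS and HRS via the LSTM's memory.
import torch
import torch.nn as nn

class RRAMCompactModel(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        # Input features per time step: applied voltage and a normalized
        # post-oxide annealing temperature; output: predicted current.
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        h, _ = self.lstm(x)   # hidden state carries the hysteresis "memory"
        return self.head(h)   # per-step current prediction

# Synthetic stand-in data: a sinusoidal voltage sweep and a toy
# history-dependent current response (the paper fits measured
# TaN/HfO2/Pt/Ti/SiO2/Si data instead).
torch.manual_seed(0)
t = torch.linspace(0, 4 * torch.pi, 200)
voltage = torch.sin(t)
temperature = torch.full_like(voltage, 0.5)  # e.g. 400°C, min-max normalized
current = torch.tanh(3 * voltage) * (0.5 + 0.5 * torch.sigmoid(torch.cumsum(voltage, 0) / 20))

x = torch.stack([voltage, temperature], dim=-1).unsqueeze(0)  # shape (1, T, 2)
y = current.unsqueeze(0).unsqueeze(-1)                        # shape (1, T, 1)

model = RRAMCompactModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()  # RMSE reported below is sqrt(MSE)

for epoch in range(500):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(f"final RMSE: {loss.sqrt().item():.4f}")
```

Because the trained network is a fixed composition of differentiable primitives (matrix multiplications, sigmoids, tanh), its equations can in principle be written out analytically, which is what makes an LSTM-based compact model a plausible candidate for SPICE-style simulators that require smooth, differentiable I-V relations.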




Journal

Journal title: IEEE Access

Year: 2021

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2020.3047491